
    Improvement of the Davey-MacKay Construction

    The Davey-MacKay construction is a deletion-insertion correcting code scheme consisting of an inner code that functions as a pilot sequence to which the receiver seeks to synchronize, and an outer code that provides error protection. We analyse the performance of the inner code in isolation, arguing that these codes provide unequal protection, and demonstrate empirically that the error rate is dependent on the data symbol values. We also propose modifications to the code construction that alleviate this asymmetry. Simulation results show that these codes have an improved performance with no penalty.
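
    To make the inner-code mechanism concrete, here is a minimal Python sketch, under assumed parameters, of adding a sparse (low-weight) codeword modulo 2 to a pilot sequence and passing the result through an insertion/deletion/substitution channel. The codebook, sequence length, and channel probabilities are illustrative, not those of the paper.

```python
# Minimal sketch (assumed parameters): a sparse codeword is added
# modulo 2 to a known pilot sequence, then sent through an
# insertion/deletion/substitution (IDS) channel.
import random

def embed(sparse_codeword, pilot):
    """Modulo-2 (XOR) addition of a sparse codeword onto the pilot."""
    return [c ^ p for c, p in zip(sparse_codeword, pilot)]

def ids_channel(bits, p_ins=0.01, p_del=0.01, p_sub=0.01):
    """Apply random insertions, deletions and substitutions."""
    out = []
    for b in bits:
        while random.random() < p_ins:        # zero or more inserted bits
            out.append(random.randint(0, 1))
        if random.random() < p_del:           # the bit is lost entirely
            continue
        out.append(b ^ 1 if random.random() < p_sub else b)
    return out

pilot = [random.randint(0, 1) for _ in range(12)]
codeword = [1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1]   # weight 3, hence "sparse"
received = ids_channel(embed(codeword, pilot))
print("pilot:   ", pilot)
print("received:", received)
```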

    Codebook and marker sequence design for synchronization-correcting codes

    We propose a construction based on synchronization and error-correcting block codes and a matched marker sequence. The block codes can correct insertion, deletion and substitution errors within each codeword. The marker sequence allows the decoder to maintain synchronization at codeword boundaries even at high error rates. An upper bound is given for the performance of these codes over a channel with random substitutions and synchronization errors. It is shown that the performance is largely dependent on the code's minimum Levenshtein distance. The performance of these codes is verified by simulation and compared to published results. In concatenation with a non-binary outer code we obtain a significant improvement in frame error rate at similar overall code rates.
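
    Since performance is tied to the code's minimum Levenshtein distance, a short sketch of that quantity may help. The following computes the minimum pairwise Levenshtein (edit) distance over a toy codebook; the codebook itself is invented for illustration.

```python
# Sketch: minimum pairwise Levenshtein distance of a toy codebook
# (the codebook is invented for illustration).
from itertools import combinations

def levenshtein(a, b):
    """Dynamic-programming edit distance (insert/delete/substitute)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                # deletion
                            curr[j - 1] + 1,            # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]

codebook = ["0001011", "0110001", "1010110", "1101000"]
d_min = min(levenshtein(a, b) for a, b in combinations(codebook, 2))
print("minimum Levenshtein distance:", d_min)
```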

    An Improved Decoding Algorithm for the Davey-MacKay Construction

    The Deletion-Insertion Correcting Code construction proposed by Davey and MacKay consists of an inner code that recovers synchronization and an outer code that provides substitution error protection. The inner code uses low-weight codewords which are added (modulo two) to a pilot sequence. The receiver is able to synchronise on the pilot sequence in spite of the changes introduced by the added codeword. The original bit-level formulation of the inner decoder assumes that all bits in the sparse codebook are independent and identically distributed. Not only is this assumption inaccurate, but it also prevents the use of soft a priori input to the decoder. We propose an alternative symbol-level inner decoding algorithm that takes the actual codebook into account. Simulation results show that the proposed algorithm has an improved performance with only a small penalty in complexity, and it allows other improvements using inner codes with larger minimum distance.
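
    The contrast between the two decoder formulations can be sketched as follows: under the bit-level i.i.d. assumption the likelihood of a received window is the same for every transmitted symbol, whereas a symbol-level likelihood computed from the actual codebook discriminates between symbols (and could be weighted by soft priors). The toy codebook and substitution-only channel below are assumptions for illustration, not the paper's construction.

```python
# Toy comparison (assumed codebook and substitution-only channel).
codebook = {0: [0, 0, 0, 1], 1: [0, 1, 0, 0], 2: [1, 0, 0, 0]}  # sparse
p_sub = 0.05          # assumed substitution probability
density = 0.25        # assumed mean density of ones in the codebook

def symbol_likelihood(received, symbol):
    """P(received window | symbol), using the symbol's actual codeword."""
    lik = 1.0
    for r, c in zip(received, codebook[symbol]):
        lik *= (1 - p_sub) if r == c else p_sub
    return lik

def bit_level_likelihood(received):
    """i.i.d. approximation: each codebook bit is 1 with prob. `density`."""
    p_one = density * (1 - p_sub) + (1 - density) * p_sub
    lik = 1.0
    for r in received:
        lik *= p_one if r == 1 else (1 - p_one)
    return lik

rx = [0, 1, 0, 0]
for s in codebook:
    print("symbol", s, "likelihood", symbol_likelihood(rx, s))
print("i.i.d. approximation (same for all symbols):", bit_level_likelihood(rx))
```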

    Forensic data hiding optimized for JPEG 2000

    This paper presents a novel image-adaptive data hiding system that uses properties of the discrete wavelet transform and is ready to use in combination with JPEG 2000. Image-adaptive watermarking schemes determine the embedding samples and strength from the image statistics. We propose to use the energy of wavelet coefficients at high frequencies to measure the amount of distortion that can be tolerated by a lower-frequency coefficient. The watermark decoder in image-adaptive data hiding needs to estimate the same parameters used for encoding from a modified source and is hence vulnerable to desynchronization. We present a novel way to resolve these synchronization issues by employing specialized insertion, deletion and substitution codes. Given the low complexity and reduced perceptual impact of the embedding technique, it is suitable for inserting camera and/or projector information to facilitate image forensics.
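
    As a sketch of the masking idea, the snippet below estimates a per-coefficient embedding strength from the energy of the finest-scale wavelet detail coefficients using PyWavelets. The wavelet, decomposition level, and scaling constant `alpha` are assumptions for illustration, not the paper's parameters.

```python
# Sketch: per-coefficient embedding strength from the energy of the
# finest-scale wavelet detail coefficients.  Wavelet, level and the
# scaling constant `alpha` are assumptions, not the paper's values.
import numpy as np
import pywt

def embedding_strength(image, wavelet="db2", level=2, alpha=0.1):
    coeffs = pywt.wavedec2(image.astype(float), wavelet, level=level)
    cH, cV, cD = coeffs[-1]             # finest-scale detail subbands
    energy = cH**2 + cV**2 + cD**2      # local high-frequency energy
    return alpha * np.sqrt(energy)      # tolerated distortion per coefficient

image = np.random.rand(64, 64)          # stand-in for a real image
print(embedding_strength(image).shape)
```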

    Evolutionary multi-objective optimization of trace transform for invariant feature extraction

    The Trace transform is a representation of images obtained by applying different functionals to the image function. When the functional is the integral, the Trace transform becomes identical to the well-known Radon transform, a useful tool in computed tomography medical imaging. The key question in the Trace transform is how to select the combination of Trace functionals that produces the optimal triple feature, which is a challenging task. In this paper, we adopt a multi-objective evolutionary algorithm adapted from the elitist non-dominated sorting genetic algorithm (NSGA-II), which has been shown to be very efficient for multi-objective optimization, to select the best functionals as well as the optimal number of projections used in the Trace transform to achieve invariant image identification. This is achieved by minimizing the within-class variance and maximizing the between-class variance. To enhance computational efficiency, the Trace parameters are calculated offline and stored, and are then used to calculate the triple features during the evolutionary optimization. The proposed Evolutionary Trace Transform (ETT) is empirically evaluated on various images from a fish database. It is shown that the proposed algorithm is very promising in that it is computationally efficient and considerably outperforms existing methods in the literature.
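
    The two optimisation objectives named above can be made concrete. The following sketch computes the within-class and between-class variance of a feature set; since NSGA-II minimises all objectives, the between-class term is negated. The feature matrix and labels are random placeholders, not Trace-transform output.

```python
# Sketch: the two objectives of the evolutionary search.  NSGA-II
# minimises all objectives, so the between-class term is negated.
# Features and labels are random placeholders.
import numpy as np

def class_variances(features, labels):
    """Return (within-class variance, between-class variance)."""
    overall_mean = features.mean(axis=0)
    within = between = 0.0
    for c in np.unique(labels):
        cls = features[labels == c]
        within += ((cls - cls.mean(axis=0)) ** 2).sum()
        between += len(cls) * ((cls.mean(axis=0) - overall_mean) ** 2).sum()
    return within, between

feats = np.random.rand(30, 3)           # 30 samples, 3 triple features
labs = np.repeat([0, 1, 2], 10)         # 3 classes of 10 samples each
w, b = class_variances(feats, labs)
print("objective vector for NSGA-II:", (w, -b))
```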

    Formal security analysis of NFC M-coupon protocols using Casper/FDR

    Near field communication (NFC) is a standards-based, radio frequency (RF), wireless communication technology that allows data to be exchanged between devices that are less than 20 cm apart. NFC security protocols require formal security analysis before mass adoption, in order to check whether these protocols meet their requirements and goals. In this paper we analyse NFC-based mobile coupon protocols using formal methods (Casper/FDR). We find an attack against the advanced protocol, and then provide a solution that addresses the vulnerability, which we verify formally.

    Integrating personality research and animal contest theory: aggressiveness in the green swordtail <i>Xiphophorus helleri</i>

    Aggression occurs when individuals compete over limiting resources. While theoretical studies have long placed a strong emphasis on the context-specificity of aggression, there is increasing recognition that consistent behavioural differences exist among individuals, and that aggressiveness may be an important component of individual personality. Though empirical studies tend to focus on one aspect or the other, we suggest there is merit in modelling both within- and among-individual variation in agonistic behaviour simultaneously. Here, we demonstrate how this can be achieved using multivariate linear mixed effect models. Using data from repeated mirror trials and dyadic interactions of male green swordtails, <i>Xiphophorus helleri</i>, we show repeatable components of (co)variation in a suite of agonistic behaviours that is broadly consistent with a major axis of variation in aggressiveness. We also show that observed focal behaviour is dependent on opponent effects, which can themselves be repeatable but were more generally found to be context specific. In particular, our models show that within-individual variation in agonistic behaviour is explained, at least in part, by the relative size of a live opponent, as predicted by contest theory. Finally, we suggest several additional applications of the multivariate models demonstrated here, including testing the recently queried functional equivalence of alternative experimental approaches (e.g., mirror trials, dyadic interaction tests) for assaying individual aggressiveness.
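
    As a rough illustration of the modelling approach, the following sketch fits a univariate simplification with statsmodels: a random intercept per focal individual captures repeatable among-individual variation, and relative opponent size enters as a fixed effect. The data, column names, and effect sizes are invented; the paper fits multivariate mixed models across a suite of behaviours.

```python
# Univariate simplification (assumed data and column names): a random
# intercept per focal individual captures repeatable among-individual
# variation; relative opponent size is a fixed effect.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_ind, n_trials = 20, 4
ind = np.repeat(np.arange(n_ind), n_trials)
intercepts = rng.normal(0, 1, n_ind)[ind]           # among-individual variance
rel_size = rng.normal(1.0, 0.15, ind.size)          # opponent relative size
bites = 5 + intercepts - 3 * (rel_size - 1) + rng.normal(0, 0.5, ind.size)

data = pd.DataFrame({"bites": bites, "rel_size": rel_size, "focal_id": ind})
model = smf.mixedlm("bites ~ rel_size", data, groups=data["focal_id"])
print(model.fit().summary())
```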

    Modular prevention of heart disease following acute coronary syndrome (ACS) [ISRCTN42984084]

    BACKGROUND: Coronary heart disease (CHD) is a major cause of morbidity and mortality in Australia, and it is recommended that all persons with unstable angina (UA) or myocardial infarction (MI) participate in secondary prevention as offered in cardiac rehabilitation (CR) programs. However, the majority of patients do not access standard CR, and these patients have a higher baseline coronary risk and poorer knowledge of CHD than persons due to commence CR. The objective of this study is to investigate whether a modular, guided self-choice approach to secondary prevention improves the coronary risk profile and knowledge of patients who do not access standard CR. METHODS/DESIGN: This randomised controlled trial with one-year follow-up will be conducted at a tertiary referral hospital. Participants eligible for but not accessing standard CR will be randomly allocated to either a modular or a conventional care group. Modular care will involve participation in individualised modules that involve choice, goal-setting and coaching. Conventional care will involve ongoing heart disease management as directed by the participant's doctors. Both the modular and conventional groups will be compared with a contemporary reference group of patients attending CR. Outcomes include measured modifiable risk factors, relative heart disease risk and knowledge of risk factors. DISCUSSION: We present the rationale and design of a randomised controlled trial testing a modular approach for the secondary prevention of coronary heart disease following acute coronary syndrome.

    Reconstructing 800 years of summer temperatures in Scotland from tree rings

    We thank The Carnegie Trust for the Universities of Scotland for funding Miloš Rydval's PhD. The Scottish pine network expansion has been an ongoing task since 2007, and funding from the following projects is acknowledged: the EU project ‘Millennium’ (017008-2), the Leverhulme Trust project ‘RELiC: Reconstructing 8000 years of Environmental and Landscape change in the Cairngorms’ (F/00 268/BG) and the NERC project ‘SCOT2K: Reconstructing 2000 years of Scottish climate from tree rings’ (NE/K003097/1).

    This study presents a summer temperature reconstruction for Scotland using Scots pine tree-ring chronologies, allowing current regional temperature changes to be placed in a longer-term context. ‘Living-tree’ chronologies were extended using ‘subfossil’ samples extracted from nearshore lake sediments, resulting in a composite chronology > 800 years in length. The North Cairngorms (NCAIRN) reconstruction was developed from a set of composite blue intensity high-pass and ring-width low-pass chronologies with a range of detrending and disturbance correction procedures. Calibration against July-August mean temperature explains 56.4% of the instrumental data variance over 1866-2009 and is well verified. Spatial correlations reveal strong coherence with temperatures over the British Isles, parts of western Europe, southern Scandinavia and northern parts of the Iberian Peninsula. NCAIRN suggests that the recent summer-time warming in Scotland is likely not unique when compared to multi-decadal warm periods observed in the 1300s, 1500s, and 1730s, although trends before the mid-16th century should be interpreted with some caution due to greater uncertainty. Prominent cold periods were identified from the 16th century until the early 1800s – agreeing with the so-called Little Ice Age observed in other tree-ring reconstructions from Europe – with the 1690s identified as the coldest decade in the record. The reconstruction shows a significant cooling response one year following volcanic eruptions, although this result is sensitive to the datasets used to identify such events. In fact, the extreme cold (and warm) years observed in NCAIRN appear more related to internal forcing of the summer North Atlantic Oscillation.
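
    The calibration step reported above (56.4% of instrumental variance over 1866-2009) can be illustrated with a simple linear regression. The sketch below uses synthetic stand-in series, not the NCAIRN data, and assumes an ordinary least-squares calibration of the chronology against July-August mean temperature.

```python
# Sketch of the calibration step with synthetic stand-in series
# (not the NCAIRN data): ordinary least-squares regression of the
# chronology against July-August mean temperature.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1866, 2010)                       # calibration period
temp = 12 + 0.5 * rng.standard_normal(years.size)   # instrumental target
chron = 0.8 * (temp - temp.mean()) + 0.3 * rng.standard_normal(years.size)

slope, intercept = np.polyfit(chron, temp, 1)       # linear calibration
recon = slope * chron + intercept
r2 = 1 - ((temp - recon) ** 2).sum() / ((temp - temp.mean()) ** 2).sum()
print(f"variance explained: {r2:.1%}")              # cf. 56.4% in the paper
```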